Transformations for variational factor analysis to speed up learning
Authors
Abstract
We propose a simple transformation of the hidden states in variational Bayesian factor analysis models to speed up the learning procedure. The speed-up is achieved by a proper parameterization of the posterior approximation that allows joint optimization of its individual factors, so the transformation is theoretically justified. We derive the transformation formulae for variational Bayesian factor analysis and show experimentally that the transformation can significantly improve the rate of convergence. In essence, the proposed transformation centers and whitens the hidden factors while taking the posterior uncertainties into account. Similar transformations can be applied to other variational Bayesian factor analysis models as well.
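The centering and whitening described in the abstract can be sketched as follows. This is an illustrative reading, not the paper's exact update rules: given Gaussian posterior factors q(x_n) with means and covariances, it subtracts the average posterior mean and rescales so that the average second moment E[x xᵀ], including the posterior covariances, becomes the identity. The function name and interfaces below are hypothetical.

```python
import numpy as np

def center_and_whiten(means, covs):
    """Center and whiten posterior hidden factors, accounting for
    posterior uncertainty (illustrative sketch only).

    means: (N, D) array of posterior means of the hidden factors
    covs:  (N, D, D) array of posterior covariances
    """
    # Centering: remove the average posterior mean across data points.
    centered = means - means.mean(axis=0)

    # Average second moment E[x x^T], including posterior covariances.
    n = len(means)
    second_moment = (centered.T @ centered) / n + covs.mean(axis=0)

    # Whitening matrix W = L^{-1} from the Cholesky factor L L^T = S,
    # so that W S W^T = I.
    L = np.linalg.cholesky(second_moment)
    W = np.linalg.inv(L)

    # Transform means and covariances consistently: x -> W x.
    new_means = centered @ W.T
    new_covs = np.einsum('ij,njk,lk->nil', W, covs, W)
    return new_means, new_covs
```

In a full factor analysis model, such a transformation of the hidden states would have to be compensated in the loading matrix (and its posterior) so that the product reconstructing the data is unchanged; only the parameterization of the posterior moves, which is what permits the joint optimization the abstract refers to.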
Similar articles
Fast, Large-Scale Transformation-Invariant Clustering
In previous work on “transformed mixtures of Gaussians” and “transformed hidden Markov models”, we showed how the EM algorithm in a discrete latent variable model can be used to jointly normalize data (e.g., center images, pitch-normalize spectrograms) and learn a mixture model of the normalized data. The only input to the algorithm is the data, a list of possible transformations, and the numbe...
How Neurofeedback could affect Working Memory and Processing Speed among Girl Students with Learning Disabilities
Background: Learning disabilities (LDs) are diagnosed in children impaired in the academic skills of reading, writing, and/or mathematics. Children with LDs usually exhibit a slower resting-state electroencephalogram (EEG), corresponding to a neurodevelopmental lag. The present study aimed to investigate the effectiveness of neurofeedback treatment on working memory and processing speed among g...
Variational Iteration Method for Free Vibration Analysis of a Timoshenko Beam under Various Boundary Conditions
In this paper, a relatively new method, namely the variational iteration method (VIM), is developed for free vibration analysis of a Timoshenko beam with different boundary conditions. In the VIM, an appropriate Lagrange multiplier is first chosen according to the order of the governing differential equation of the boundary value problem, and then an iteration process is used till the desired accuracy ...
Sequential Learning of Layered Models from Video
A popular framework for the interpretation of image sequences is the layers or sprite model, see e.g. [1], [2]. Jojic and Frey [3] provide a generative probabilistic model framework for this task, but their algorithm is slow as it needs to search over discretized transformations (e.g. translations, or affines) for each layer simultaneously. Exact computation with this model scales exponentially...
Natural Conjugate Gradient in Variational Inference
Variational methods for approximate inference in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family. For them, variational Bayesian expectation maximization (VB EM) algorithms are not easily available, and gradient-based m...